Firework Injury: Markov Model
Introduction and Overview of Decision Problem
This case study will build on the initial analysis of firework-related injuries in Columbia using a decision tree model. We will extend our evaluation by employing a Markov cohort model. While the decision tree provided immediate outcomes and costs for various strategies, the Markov model offers significant advantages for examining long-term health and cost impacts.
Our Markov model will capture the progression of firework-related injuries over an extended period, allowing us to consider long-term implications. By modeling injury events and transitions between health states over time, we can better understand the chronic effects of injuries and the long-term benefits of intervention strategies.
This approach will provide a more comprehensive assessment of the lifetime health and economic impacts of the proposed strategies, considering factors such as long-term healthcare costs, changes in injury rates, and compliance levels. The Markov model’s ability to incorporate these extended horizons and recurring events will yield deeper insights into the most effective and sustainable approaches to mitigating firework-related injuries in Columbia.
Alive-Dead Model
We will start by constructing a simple Markov model to represent the progression of firework-related injuries in Columbia. Our initial model will consist of two health states: “Alive” and “Dead.” We will assume that individuals transition between these states based on survival probabilities calculated from life table data.
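Before building the model in Amua, it can help to see what a two-state Markov cohort computes. The sketch below is a minimal illustration, assuming a hypothetical constant annual death probability of 0.02; in the lab, the actual probabilities come from life table data.

```python
# Minimal two-state (Alive/Dead) Markov cohort sketch.
# p_die is a hypothetical constant annual death probability,
# standing in for the life-table values used in the lab.
p_die = 0.02
alive, dead = 1.0, 0.0  # the cohort starts fully alive

for cycle in range(1, 6):
    deaths = alive * p_die   # fraction of the cohort dying this cycle
    alive -= deaths
    dead += deaths
    print(f"cycle {cycle}: alive={alive:.4f}, dead={dead:.4f}")
```

Each cycle, a fixed fraction of the Alive state moves to Dead, and Dead is absorbing, which is exactly the structure the Alive-Dead tree will encode.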
Markov Structure: Alive-Dead
The structure of the Alive-Dead Markov model is shown in the figure below. Amua has a special Markov node symbol. The branches that lead off a Markov node designate all the Markov states (and only Markov states).
In this example, there are two health states: (1) Alive, (2) Dead. Off each health state, you can create a subtree (also called a cycle tree) that reflects those events that can occur during a cycle. The last branch at the end of each pathway will be a state transition, which defines what state to go to for the next cycle.
Note that in a Markov model, outcomes are defined at the state itself, NOT at the end of the branch.
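To illustrate why outcomes attach to states rather than branch endpoints, the sketch below accrues an outcome (life-years) from state occupancy each cycle. The 0.02 death probability is again a hypothetical stand-in for life-table values.

```python
# Sketch: outcomes accrue from the state occupied each cycle,
# not from branch endpoints. Hypothetical constant death probability.
p_die = 0.02
alive = 1.0
life_years = 0.0

for cycle in range(10):
    life_years += alive * 1.0   # the Alive state contributes 1 life-year per cycle
    alive *= (1 - p_die)        # then part of the cohort transitions to Dead

print(f"cumulative life-years over 10 cycles: {life_years:.4f}")
```

In Amua, this is why rewards (costs, life-years, QALYs) are entered on the health states rather than at the terminal branches of each cycle tree.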
Building the Tree
Structure
After you open Amua, click Model > New Markov Model.
Save your model right at the start.
Now select the decision node, right-click, and select Add > Markov Chain.
Start by developing the structure of the Markov model using Alive and Dead as the health states. The branches of the Markov chain should correspond to the states of the model. In the name field to the right of the decision node, enter Alive-Dead.
Now, complete the structure of the Markov model using the information above. Note: focus on adding the branches and transitions for this step; Parameters will be added later in this lab.
When you reach the end of a branch, select the chance node you would like to turn into a state transition, right-click, and select Change to State Transition. This will give you the blue arrow. To the right of this arrow is a dropdown menu listing the health states you specified; select the health state this part of the cohort will transition to.
With the align button you can align the end nodes.